Multimodal emotion recognition method based on multiscale convolution and self-attention feature fusion
Tian CHEN, Conghu CAI, Xiaohui YUAN, Beibei LUO
Journal of Computer Applications    2024, 44 (2): 369-376.   DOI: 10.11772/j.issn.1001-9081.2023020185

Emotion recognition based on physiological signals is affected by noise and other factors, resulting in low accuracy and weak cross-individual generalization. To address this issue, a multimodal emotion recognition method based on ElectroEncephaloGram (EEG), ElectroCardioGram (ECG), and eye movement signals was proposed. Firstly, multi-scale convolution was applied to the physiological signals to obtain higher-dimensional signal features while reducing the parameter size. Secondly, self-attention was employed in the fusion of the multimodal signal features to enhance the weights of key features and reduce feature interference between modalities. Finally, a Bi-directional Long Short-Term Memory (Bi-LSTM) network was used to extract temporal information from the fused features and perform classification. Experimental results show that the proposed method achieves recognition accuracies of 90.29%, 91.38%, and 83.53% on the valence, arousal, and valence/arousal four-class recognition tasks, respectively, improvements of 3.46-7.11 and 0.92-3.15 percentage points over the EEG single-modality and EEG+ECG bimodal methods. The proposed method recognizes emotions accurately and with better recognition stability across individuals.
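The two core steps of this pipeline, multi-scale convolution followed by self-attention fusion across modalities, can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the averaging kernels stand in for learned convolution weights, and the function names and kernel sizes are illustrative assumptions.

```python
import numpy as np

def multiscale_conv(x, kernel_sizes=(3, 5, 7)):
    """Apply 1-D convolutions at several scales and stack the outputs.
    The averaging kernels are placeholders for learned filter weights."""
    feats = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k
        feats.append(np.convolve(x, kernel, mode="valid"))
    # Trim every scale's output to a common length before stacking
    n = min(len(f) for f in feats)
    return np.stack([f[:n] for f in feats])  # shape: (num_scales, n)

def self_attention_fusion(modalities):
    """Fuse per-modality feature vectors (e.g. EEG, ECG, eye movement)
    with scaled dot-product self-attention: similar modalities reinforce
    each other, so key features get larger weights."""
    X = np.asarray(modalities, dtype=float)        # (m, d)
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                  # pairwise similarities, (m, m)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over modalities
    return weights @ X                             # re-weighted features, (m, d)
```

In the paper's setting the fused features would then feed a Bi-LSTM classifier; here the sketch stops at the fused representation.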

Bimodal emotion recognition method based on graph neural network and attention
Lubao LI, Tian CHEN, Fuji REN, Beibei LUO
Journal of Computer Applications    2023, 43 (3): 700-705.   DOI: 10.11772/j.issn.1001-9081.2022020216

To address the issues in emotion recognition from physiological signals, a bimodal emotion recognition method based on Graph Neural Network (GNN) and attention was proposed. Firstly, a GNN was used to classify ElectroEncephaloGram (EEG) signals. Secondly, an attention-based Bi-directional Long Short-Term Memory (Bi-LSTM) network was used to classify ElectroCardioGram (ECG) signals. Finally, the EEG and ECG classification results were fused by Dempster-Shafer evidence theory, improving the overall performance of the emotion recognition task. To verify the effectiveness of the proposed method, 20 subjects were invited to participate in an emotion elicitation experiment, and their EEG and ECG signals were collected. Experimental results show that the binary classification accuracies of the proposed method are 91.82% and 88.24% in the valence and arousal dimensions, respectively, which are 2.65% and 0.40% higher than those of the single-modal EEG method, and 19.79% and 24.90% higher than those of the single-modal ECG method. The proposed method thus effectively improves the accuracy of emotion recognition and can provide decision support for medical diagnosis and other fields.
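The decision-level fusion step, combining the EEG and ECG classifiers' outputs with Dempster-Shafer evidence theory, can be sketched as Dempster's rule of combination. This is a minimal sketch, not the paper's code: the class labels and mass values below are hypothetical, and each classifier's softmax output is assumed to have been converted into a basic probability assignment.

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping a frozenset
    of class labels to a mass) with Dempster's rule. Masses on disjoint
    hypotheses form the conflict K, and the surviving masses are
    renormalized by 1 - K."""
    combined = {}
    conflict = 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            inter = h1 & h2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}

# Hypothetical per-modality beliefs over two emotion classes
eeg = {frozenset({"positive"}): 0.8, frozenset({"negative"}): 0.2}
ecg = {frozenset({"positive"}): 0.6, frozenset({"negative"}): 0.4}
fused = dempster_combine(eeg, ecg)
```

Because both sources lean toward "positive", the combined mass on that class exceeds either individual belief, which is how the fusion can outperform each single-modal classifier.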
